A new iteration for computing the eigenvalues of semiseparable (plus diagonal) matrices
Authors
Abstract
This paper proposes a new type of iteration, based on a structured rank factorization, for computing eigenvalues of semiseparable and semiseparable plus diagonal matrices; the case of higher order semiseparability ranks is also covered. More precisely, instead of the traditional QR-iteration, a QH-iteration will be used. The QH-factorization is characterized by a unitary matrix Q and a Hessenberg-like matrix H (often called a lower semiseparable matrix), whose lower triangular part is of semiseparable form. The Q factor of this factorization determines the similarity transformation of the QH-method. It will be shown that this iteration is extremely useful for computing the eigenvalues of structured rank matrices. Whereas the traditional QR-method applied to semiseparable (plus diagonal) and Hessenberg-like matrices uses similarity transformations involving 2p(n − 1) Givens transformations (where p denotes the semiseparability rank), this iteration only needs p(n − 1) Givens transformations, which is comparable to the tridiagonal and Hessenberg situation in the case p = 1. It will also be shown that this method can in some sense be interpreted as an extension of the traditional QR-method for Hessenberg matrices, i.e., the traditional case also fits into this framework. Based on results in another paper, it will be shown that this iteration also exhibits an extra type of convergence behavior compared with the traditional QR-method. The algorithm will be implemented in an implicit way, based on the Givens-weight representation of the involved structured rank matrices. Numerical experiments will show the viability of the approach, demonstrating that it yields a better complexity and also gives rise to more accurate results.
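For a concrete point of reference, the sketch below (Python/NumPy; the generator vectors u, v, the diagonal d and all helper names are illustrative assumptions, not taken from the paper) builds a symmetric generator-representable semiseparable-plus-diagonal matrix and runs a plain shifted QR iteration on it with dense factorizations. It only illustrates the QR-type similarity iteration the QH-method is compared against; the implicit, structured implementation proposed in the paper works with p(n − 1) Givens transformations per step rather than dense factorizations.

    import numpy as np

    def semiseparable_plus_diagonal(u, v, d):
        # Symmetric generator-representable semiseparable part:
        # S[i, j] = u[i] * v[j] for i >= j, mirrored above the diagonal,
        # plus an arbitrary diagonal d.
        S = np.tril(np.outer(u, v))
        S = S + np.tril(S, -1).T
        return S + np.diag(d)

    def dense_qr_iteration(A, tol=1e-13, max_steps=1000):
        # Classical shifted QR iteration: each step is a similarity
        # transformation A <- Q^T A Q; no structure is exploited here.
        A = np.array(A, dtype=float)
        n = A.shape[0]
        eigs = []
        for _ in range(max_steps):
            if n == 1:
                break
            mu = A[n - 1, n - 1]                      # Rayleigh-quotient shift
            Q, R = np.linalg.qr(A[:n, :n] - mu * np.eye(n))
            A[:n, :n] = R @ Q + mu * np.eye(n)
            if np.linalg.norm(A[n - 1, :n - 1]) < tol * np.linalg.norm(A[:n, :n]):
                eigs.append(A[n - 1, n - 1])          # deflate one eigenvalue
                n -= 1
        eigs.extend(np.linalg.eigvalsh(A[:n, :n]))    # finish any remaining block
        return np.sort(eigs)

    rng = np.random.default_rng(0)
    n = 8
    A = semiseparable_plus_diagonal(rng.standard_normal(n),
                                    rng.standard_normal(n),
                                    rng.standard_normal(n))
    print(np.allclose(dense_qr_iteration(A), np.linalg.eigvalsh(A)))  # expect True

Each dense step above costs O(n^3); the point of the structured QH-iteration is to achieve the same kind of similarity iteration at a cost proportional to the number of Givens transformations quoted in the abstract.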
Similar references
A mathematically simple method based on definition for computing eigenvalues, generalized eigenvalues and quadratic eigenvalues of matrices
In this paper, a fundamentally new method, based on the definition, is introduced for the numerical computation of eigenvalues, generalized eigenvalues and quadratic eigenvalues of matrices. Some examples are provided to show the accuracy and reliability of the proposed method. It is shown that the proposed method generates sequences different from those of existing methods, but they are still convergent to th...
A multiple shift QR-step for structured rank matrices
Eigenvalue computations for structured rank matrices are the subject of many investigations nowadays. There exist methods for transforming matrices into structured rank form, QR-algorithms for semiseparable and semiseparable plus diagonal form, methods for reducing structured rank matrices efficiently to Hessenberg form and so forth. Eigenvalue computations for the symmetric case, involving sem...
On the convergence properties of the orthogonal similarity transformations to tridiagonal and semiseparable (plus diagonal) form
In this paper, we will compare the convergence properties of three basic reduction methods, by placing them in a general framework. It covers the reduction to tridiagonal, semiseparable and semiseparable plus diagonal form. These reductions are often used as...
Computing the Matrix Geometric Mean of Two HPD Matrices: A Stable Iterative Method
A new iteration scheme for computing the sign of a matrix which has no pure imaginary eigenvalues is presented. Then, by applying a well-known identity from the theory of matrix functions, an algorithm for computing the geometric mean of two Hermitian positive definite matrices is constructed. Moreover, another efficient algorithm for this purpose is derived free from the computation of principal matrix...
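The excerpt does not spell out which identity is meant; one standard candidate linking the matrix sign function to the geometric mean A # B of two Hermitian positive definite matrices A and B is the following (offered here only as background, not as the construction of the cited paper):

    \operatorname{sign}\!\left(\begin{bmatrix} 0 & A \\ B^{-1} & 0 \end{bmatrix}\right)
      = \begin{bmatrix} 0 & A \# B \\ (A \# B)^{-1} & 0 \end{bmatrix},
    \qquad A \# B = A\,(A^{-1} B)^{1/2}.

Under this identity, any stable iteration for the matrix sign of the block matrix on the left, such as a Newton-type sign iteration, delivers the geometric mean and its inverse simultaneously.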
Orthogonal similarity transformation of a symmetric matrix into a diagonal-plus-semiseparable one with free choice of the diagonal
It is well-known how any symmetric matrix can be transformed into a similar tridiagonal one [1, 2]. This orthogonal similarity transformation forms the basic step for various algorithms. For example if one wants to compute the eigenvalues of a symmetric matrix, one can first transform it into a similar tridiagonal one and then compute the eigenvalues of this tridiagonal matrix. Very recently an a...
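A minimal sketch of the classical two-step approach recalled above, tridiagonalize and then solve the tridiagonal eigenvalue problem (Python/SciPy; the random test matrix is an illustrative assumption, and this shows only the tridiagonal background, not the diagonal-plus-semiseparable reduction of the cited work):

    import numpy as np
    from scipy.linalg import hessenberg, eigvalsh_tridiagonal

    rng = np.random.default_rng(1)
    A = rng.standard_normal((6, 6))
    A = (A + A.T) / 2                      # symmetric test matrix

    # Orthogonal similarity transformation to Hessenberg form; for a
    # symmetric matrix the result T = Q.T @ A @ Q is tridiagonal.
    T, Q = hessenberg(A, calc_q=True)

    # The eigenvalues of the tridiagonal matrix are those of A.
    ev = eigvalsh_tridiagonal(np.diag(T), np.diag(T, -1))
    print(np.allclose(np.sort(ev), np.linalg.eigvalsh(A)))  # expect True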
Publication date: 2007